Low Complexity Damped Gauss-Newton Algorithms for CANDECOMP/PARAFAC

Authors

Anh-Huy Phan, Petr Tichavský, Andrzej Cichocki

Abstract

The damped Gauss-Newton (dGN) algorithm for CANDECOMP/PARAFAC (CP) decomposition can handle the challenges of collinearity of factors and different magnitudes of factors; nevertheless, for factorization of an N-D tensor of size I1 × · · · × IN with rank R, the algorithm is computationally demanding due to the construction of a large approximate Hessian of size RT × RT and its inversion, where T = I1 + · · · + IN...
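
Since the full dGN step is exactly what the paper's low-complexity variants avoid, a minimal numpy sketch of that expensive baseline for a 3-way tensor may help fix ideas. The function below is illustrative only and is not the paper's algorithm: it builds the full Jacobian and solves the damped normal equations directly.

```python
import numpy as np

def dgn_step(T, A, B, C, mu=1e-2):
    """One naive damped Gauss-Newton step for a 3-way CP model
    T ~ sum_r A[:, r] o B[:, r] o C[:, r]  (o = outer product).

    Forms the full Jacobian of vec(model) w.r.t. the stacked factors
    and solves (J^T J + mu*I) dx = J^T r directly, i.e. the RT x RT
    system (here T = I1 + I2 + I3) whose cost motivates the paper.
    """
    I1, I2, I3 = T.shape
    R = A.shape[1]
    model = np.einsum('ir,jr,kr->ijk', A, B, C)
    r = (T - model).ravel()
    # Jacobian blocks w.r.t. vec(A), vec(B), vec(C) (row-major vec).
    JA = np.einsum('il,jr,kr->ijklr', np.eye(I1), B, C).reshape(-1, I1 * R)
    JB = np.einsum('ir,jl,kr->ijklr', A, np.eye(I2), C).reshape(-1, I2 * R)
    JC = np.einsum('ir,jr,kl->ijklr', A, B, np.eye(I3)).reshape(-1, I3 * R)
    J = np.hstack([JA, JB, JC])
    dx = np.linalg.solve(J.T @ J + mu * np.eye(J.shape[1]), J.T @ r)
    nA, nB = I1 * R, I2 * R
    return (A + dx[:nA].reshape(I1, R),
            B + dx[nA:nA + nB].reshape(I2, R),
            C + dx[nA + nB:].reshape(I3, R))

# One step on a small random problem (illustrative only).
rng = np.random.default_rng(0)
T = rng.normal(size=(4, 5, 6))
A, B, C = (rng.normal(size=(n, 3)) for n in (4, 5, 6))
A, B, C = dgn_step(T, A, B, C)
```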


Similar Articles


Low complexity secant quasi-Newton minimization algorithms for nonconvex functions

This work points out some interesting relations between results on basic optimization and algorithms for nonconvex functions (such as BFGS and secant methods). In particular, some innovative tools for improving our recent secant BFGS-type and LQN algorithms are described in detail.

Full text
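
For orientation, here is the classical BFGS inverse-Hessian update that secant BFGS-type methods build on (a textbook formula, not the paper's LQN variant); the curvature guard is the usual safeguard on nonconvex functions.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Classical BFGS update of the inverse-Hessian approximation H.

    s = x_new - x_old, y = grad_new - grad_old. The update enforces
    the secant condition H_new @ y == s. On nonconvex functions the
    curvature s @ y can be non-positive, in which case the update
    is skipped to keep H positive definite.
    """
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return H                      # skip: curvature condition fails
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```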

A damped Gauss-Newton method for the second-order cone complementarity problem

We investigate some properties related to the generalized Newton method for the Fischer-Burmeister (FB) function over second-order cones, which allows us to reformulate the second-order cone complementarity problem (SOCCP) as a semismooth system of equations. Specifically, we characterize the B-subdifferential of the FB function at a general point and study the condition for every element of th...

Full text
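
A short sketch of the FB function over a second-order cone may clarify the reformulation described above: the Jordan-algebra square root is computed from the standard spectral decomposition, and phi vanishes exactly at complementary pairs. The helper names are illustrative.

```python
import numpy as np

def jordan_square(z):
    """Jordan square z o z for the second-order cone, z = (z0, z_bar)."""
    z0, zb = z[0], z[1:]
    return np.concatenate(([z0 ** 2 + zb @ zb], 2.0 * z0 * zb))

def soc_sqrt(w):
    """Square root of w (assumed in the cone) via spectral decomposition:
    w = lam1*u1 + lam2*u2, sqrt(w) = sqrt(lam1)*u1 + sqrt(lam2)*u2."""
    w0, wb = w[0], w[1:]
    nb = np.linalg.norm(wb)
    s1 = np.sqrt(max(w0 - nb, 0.0))   # clip guards tiny negative rounding
    s2 = np.sqrt(max(w0 + nb, 0.0))
    u = wb / nb if nb > 0 else np.zeros_like(wb)
    return np.concatenate(([(s1 + s2) / 2.0], (s2 - s1) / 2.0 * u))

def fb_soc(x, y):
    """Fischer-Burmeister function over a second-order cone:
    phi(x, y) = x + y - sqrt(x o x + y o y).
    phi(x, y) = 0 iff x, y lie in the cone and x o y = 0, which is
    how the SOCCP becomes a (semismooth) system of equations."""
    return x + y - soc_sqrt(jordan_square(x) + jordan_square(y))

# Example: a complementary pair on the boundary of the 3-d cone.
x = np.array([1.0, 1.0, 0.0])   # x in SOC (x0 >= ||x_bar||)
y = np.array([1.0, -1.0, 0.0])  # y in SOC, x o y = 0
print(fb_soc(x, y))             # ~ [0, 0, 0]
```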

On the Behavior of Damped Quasi-Newton Methods for Unconstrained Optimization

We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive defi...

Full text
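
The hybrid-vector idea can be illustrated with Powell's classical damping rule, assuming it is representative of this family (the paper's exact rule may differ):

```python
import numpy as np

def damped_bfgs_update(B, s, y, sigma=0.2):
    """Powell-damped BFGS update of a Hessian approximation B.

    The gradient change y is replaced by the hybrid vector
    y_hat = theta*y + (1 - theta)*(B @ s), with theta chosen so that
    s @ y_hat >= sigma * (s @ B @ s). This keeps the updated matrix
    positive definite even when s @ y <= 0, as can happen on
    nonconvex functions or with inexact line searches.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    theta = 1.0 if sy >= sigma * sBs else (1.0 - sigma) * sBs / (sBs - sy)
    y_hat = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(y_hat, y_hat) / (s @ y_hat)
```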

Practical Gauss-Newton Optimisation for Deep Learning

The curvature matrix depends on the specific optimisation method and will often be only an estimate. For notational simplicity, the dependence of f̂ on θ is omitted. Setting C to the true Hessian matrix of f would make f̂ the exact second-order Taylor expansion of the function around θ. However, when f is a nonlinear function, the Hessian can be indefinite, which leads to an ill-conditioned quadra...

Full text
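
To illustrate why a Gauss-Newton curvature matrix is preferred to the indefinite Hessian, the sketch below forms G = JᵀJ for a squared-error loss on a tiny tanh network; the network and the finite-difference Jacobian are purely illustrative, not the paper's setup.

```python
import numpy as np

def gauss_newton_matrix(model, theta, X, eps=1e-6):
    """Gauss-Newton curvature G = J^T H_L J for a squared-error loss.

    For L = 0.5 * ||model(theta, X) - targets||^2 the loss Hessian
    H_L is the identity, so G = J^T J with J the Jacobian of the
    model outputs w.r.t. theta (finite differences, for illustration).
    G is positive semidefinite even where the true Hessian of the
    training loss is indefinite.
    """
    f0 = model(theta, X)
    J = np.empty((f0.size, theta.size))
    for i in range(theta.size):
        t = theta.copy()
        t[i] += eps
        J[:, i] = (model(t, X) - f0) / eps
    return J.T @ J

# Tiny one-hidden-layer tanh network as a stand-in model.
def net(theta, X):
    W1 = theta[:6].reshape(3, 2)          # hidden-layer weights
    w2 = theta[6:]                        # output weights
    return np.tanh(X @ W1.T) @ w2

rng = np.random.default_rng(0)
G = gauss_newton_matrix(net, rng.normal(size=9), rng.normal(size=(5, 2)))
print(np.linalg.eigvalsh(G).min() >= -1e-8)   # PSD check -> True
```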


Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2013

ISSN: 0895-4798, 1095-7162

DOI: 10.1137/100808034